Efficient Modeling of Latent Information in Supervised Learning using Gaussian Processes

Dai, Zhenwen, Álvarez, Mauricio, Lawrence, Neil

Neural Information Processing Systems

Often in machine learning, data are collected as a combination of multiple conditions, e.g., voice recordings of multiple persons, each labeled with an ID. How could we build a model that captures the latent information related to these conditions and generalizes to a new one from few data? We present a new model, the Latent Variable Multiple Output Gaussian Process (LVMOGP), that jointly models multiple conditions for regression and generalizes to a new condition from only a few data points at test time. LVMOGP infers the posteriors of Gaussian processes together with a latent space representing the information about the different conditions. We derive an efficient variational inference method for LVMOGP whose computational complexity is as low as that of sparse Gaussian processes. We show that LVMOGP significantly outperforms related Gaussian process methods on various tasks with both synthetic and real data.
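
To make the modelling idea concrete, below is a minimal numpy sketch of the product-kernel construction that underlies a model of this kind: a shared input kernel k_X modulated by a kernel k_H over latent condition embeddings. The squared-exponential forms, the fixed (rather than variationally inferred) embeddings H, and all variable names are illustrative assumptions; this is not the paper's inference algorithm.

    # Sketch of the k_X * k_H product-kernel structure behind LVMOGP-style
    # models. Assumes RBF forms for both kernels; the condition embeddings H
    # are fixed here, whereas the paper infers them variationally.
    import numpy as np

    def rbf(A, B, lengthscale=1.0, variance=1.0):
        """Squared-exponential kernel between the row vectors of A and B."""
        d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
        return variance * np.exp(-0.5 * d2 / lengthscale ** 2)

    rng = np.random.default_rng(0)
    N, D, Q = 20, 3, 2                 # inputs, conditions, latent dims
    X = np.linspace(0.0, 5.0, N)[:, None]   # inputs shared across conditions
    H = rng.normal(size=(D, Q))             # latent condition embeddings
    noise = 0.1

    # With a grid design (every input observed under every condition), the
    # covariance of the stacked outputs is the Kronecker product
    # k_H(H, H) (x) k_X(X, X): conditions modulate a shared input kernel.
    Kx = rbf(X, X)
    Kh = rbf(H, H)
    K = np.kron(Kh, Kx)                     # (N*D, N*D)

    # Sample from the prior and compute the GP posterior mean at the
    # training points, as a sanity check of the construction.
    L = np.linalg.cholesky(K + 1e-8 * np.eye(N * D))
    y = L @ rng.normal(size=N * D) + noise * rng.normal(size=N * D)
    alpha = np.linalg.solve(K + noise ** 2 * np.eye(N * D), y)
    mean = K @ alpha
    print(mean.shape)                       # (60,)

Note that this naive construction materializes the full (N*D) x (N*D) covariance and costs O((ND)^3); the point of the paper's variational method is precisely to avoid that, bringing the cost down to the level of sparse Gaussian processes.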


Reviews: Efficient Modeling of Latent Information in Supervised Learning using Gaussian Processes

Neural Information Processing Systems

I maintain my assessment and do not recommend publication at this stage. The core contribution is representing conditions with latent variables and deriving a variational inference algorithm to cope with the resulting intractability. This is interesting, but the discussion around it could be much improved. Some possible improvements are addressed in the author feedback, e.g., I am not sure how Fig. 1 could have been understood without the complementary explanation brought up in the feedback. Beyond what has been addressed in the author feedback, some work is needed to make this paper appealing (which the idea under study, the method, and the results seem to call for):

- clarifying the mathematical formulation, e.g., what forms of k_H are examined; providing a full probabilistic model summary (see the sketch after this review); pointing out design choices
- pointing out differences from or similarities with existing work
- removing the gratuitous reference to deep learning in the introduction (it detracts)
- making sure that all important questions a reader might have are addressed

# Overall assessment

The issue addressed (modelling univariate outputs which were generated under different, known conditions) and the modelling choice (representing conditions as latent variables) are interesting.
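
For orientation, a minimal sketch of the kind of probabilistic model summary the review asks for, consistent with the abstract: h_d denotes the latent variable for condition d, and the exact forms of k_X and k_H (e.g., squared-exponential) are among the design choices the review asks the authors to state explicitly.

    \begin{align*}
    \mathbf{h}_d &\sim \mathcal{N}(\mathbf{0}, \mathbf{I}), \qquad d = 1, \dots, D, \\
    f &\sim \mathcal{GP}\!\big(0,\ k_X(\mathbf{x}, \mathbf{x}')\, k_H(\mathbf{h}, \mathbf{h}')\big), \\
    y_{n,d} &= f(\mathbf{x}_n, \mathbf{h}_d) + \epsilon_{n,d}, \qquad \epsilon_{n,d} \sim \mathcal{N}(0, \sigma^2).
    \end{align*}

Under this structure the joint covariance of the stacked outputs factorizes over inputs and conditions, which is the kind of structure an efficient variational inference scheme can exploit.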

